Boosting with structural sparsity: A differential inclusion approach
Authors
Abstract
Similar Resources
Speed and Sparsity of Regularized Boosting
Boosting algorithms with l1-regularization are of interest because l1 regularization leads to sparser composite classifiers. Moreover, Rosset et al. have shown that for separable data, standard lp-regularized loss minimization results in a margin-maximizing classifier in the limit as regularization is relaxed. For the case p = 1, we extend these results by obtaining explicit convergence bounds o...
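The ℓ1-regularized loss minimization this abstract refers to can be illustrated with a minimal proximal-gradient (ISTA) sketch on the logistic loss. This is a generic sketch, not the paper's algorithm; the toy data, step size, and penalty level are illustrative assumptions:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: shrinks entries toward exact zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_logistic_ista(X, y, lam, lr=0.1, iters=500):
    """ISTA for min_w (1/n) * logistic_loss(Xw, y) + lam * ||w||_1."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        grad = X.T @ (p - y) / len(y)          # gradient of the smooth part
        w = soft_threshold(w - lr * grad, lr * lam)  # prox step induces sparsity
    return w

# Toy separable data: only the first 2 of 10 features are informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w = l1_logistic_ista(X, y, lam=0.1)
# The l1 penalty drives most uninformative coefficients exactly to zero,
# which is the sparsity effect the abstract above describes.
```

The soft-thresholding step is what distinguishes this from plain gradient descent: coefficients whose gradient signal stays below the penalty level are pinned at exactly zero rather than merely shrunk.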
Structural Data Recognition with Graph Model Boosting
This paper presents a novel method for structural data recognition using a large number of graph models. In general, prevalent methods for structural data recognition have two shortcomings: 1) Only a single model is used to capture structural variation. 2) Naive recognition methods are used, such as the nearest neighbor method. In this paper, we propose strengthening the recognition performance...
Sparsity priors and boosting for learning localized distributed feature representations
This technical report presents a study of methods for learning sparse codes and localized features from data. In the context of this study, we propose a new prior for generating sparse image codes with low-energy, localized features. The experiments show that with this prior, it is possible to encode the model with significantly fewer bits without affecting accuracy. The report also introduces ...
A relaxation theorem for a differential inclusion with "maxima"
We consider a Cauchy problem associated to a nonconvex differential inclusion with "maxima" and we prove a Filippov-type existence result. This result allows us to obtain a relaxation theorem for the problem considered.
Appendix: Sharing Features in Multi-class Boosting via Group Sparsity
In this document we provide a complete derivation for multi-class boosting with group sparsity and a full explanation of the ADMM algorithm presented in the main paper. We first provide the derivation for the multi-class logistic loss with the ℓ1,2-norm. We then show the difference between our boosting with the ℓ1,2-norm and the ℓ1,∞-norm. We then briefly discuss our group s...
Journal
Journal Title: Applied and Computational Harmonic Analysis
Year: 2020
ISSN: 1063-5203
DOI: 10.1016/j.acha.2017.12.004